In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors.

==Formal statement==
Pinsker's inequality states that, if <math>P</math> and <math>Q</math> are two probability distributions on a measurable space <math>(X, \Sigma)</math>, then
:<math>\delta(P, Q) \le \sqrt{\tfrac{1}{2} D_{\mathrm{KL}}(P \parallel Q)},</math>
where
:<math>\delta(P, Q) = \sup \{\, |P(A) - Q(A)| : A \in \Sigma \,\}</math>
is the total variation distance (or statistical distance) between <math>P</math> and <math>Q</math>, and
:<math>D_{\mathrm{KL}}(P \parallel Q) = \operatorname{E}_P\!\left(\log \frac{\mathrm{d}P}{\mathrm{d}Q}\right) = \int_X \log \frac{\mathrm{d}P}{\mathrm{d}Q} \, \mathrm{d}P</math>
is the Kullback–Leibler divergence in nats. When the sample space <math>X</math> is a finite set, the Kullback–Leibler divergence is given by
:<math>D_{\mathrm{KL}}(P \parallel Q) = \sum_{i \in X} P(i) \log \frac{P(i)}{Q(i)}.</math>

Note that in terms of the total variation norm <math>\|P - Q\|</math> of the signed measure <math>P - Q</math>, Pinsker's inequality differs from the one given above by a factor of two:
:<math>\|P - Q\| \le \sqrt{2 D_{\mathrm{KL}}(P \parallel Q)}.</math>

The proof of Pinsker's inequality uses the partition inequality for ''f''-divergences.
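The finite-sample-space form of the inequality can be checked numerically. The following is a minimal sketch (not from the original article); the distributions <code>p</code> and <code>q</code> are arbitrary illustrative values, and the total variation distance is computed via the identity <math>\delta(P, Q) = \tfrac{1}{2} \sum_i |P(i) - Q(i)|</math>, which holds on a finite space.

<syntaxhighlight lang="python">
import numpy as np

# Two example distributions on a finite sample space (hypothetical values).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# Total variation distance: delta(P, Q) = (1/2) * sum_i |P(i) - Q(i)|.
tv = 0.5 * np.abs(p - q).sum()

# Kullback-Leibler divergence in nats (natural logarithm).
kl = np.sum(p * np.log(p / q))

# Pinsker's inequality: delta(P, Q) <= sqrt(D_KL(P || Q) / 2).
print(f"TV distance   = {tv:.6f}")
print(f"KL divergence = {kl:.6f}")
print(f"Pinsker bound = {np.sqrt(kl / 2):.6f}")
assert tv <= np.sqrt(kl / 2)
</syntaxhighlight>

For these values the total variation distance is 0.1 and the Pinsker bound is roughly 0.112, so the inequality holds with little slack, consistent with it being tight up to constant factors.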